Wittgenstein's Family Resemblance Clustering Algorithm

Amanpour, Golbahar, Ghojogh, Benyamin

arXiv.org Machine Learning

This paper, introducing a novel method in philo-matics, draws on Wittgenstein's concept of family resemblance from analytic philosophy to develop a clustering algorithm for machine learning. According to Wittgenstein's Philosophical Investigations (1953), family resemblance holds that members of a concept or category are connected by overlapping similarities rather than a single defining property. Consequently, a family of entities forms a chain of items sharing overlapping traits. This philosophical idea naturally lends itself to a graph-based approach in machine learning. Accordingly, we propose the Wittgenstein's Family Resemblance (WFR) clustering algorithm and its kernel variant, kernel WFR. This algorithm computes resemblance scores between neighboring data instances, and after thresholding these scores, a resemblance graph is constructed. The connected components of this graph define the resulting clusters. Simulations on benchmark datasets demonstrate that WFR is an effective nonlinear clustering algorithm that does not require prior knowledge of the number of clusters or assumptions about their shapes.
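The pipeline the abstract describes — score resemblance between neighboring points, threshold the scores into a graph, and read clusters off the connected components — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian resemblance score, the k-nearest-neighbor restriction, and the parameter names are all assumptions.

```python
import numpy as np
from collections import deque

def wfr_cluster(X, k=6, threshold=0.5, gamma=1.0):
    """Family-resemblance-style clustering sketch: threshold pairwise
    resemblance scores among neighbors into a graph, then return the
    connected components of that graph as cluster labels."""
    n = len(X)
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Gaussian resemblance score (an assumed choice of score function).
    score = np.exp(-gamma * d2)
    # Keep an edge only between k-nearest neighbors whose score clears
    # the threshold; edges are added symmetrically.
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in np.argsort(d2[i])[1:k + 1]:
            if score[i, j] >= threshold:
                adj[i].append(j)
                adj[j].append(i)
    # BFS over the resemblance graph: each connected component is a cluster.
    labels = np.full(n, -1)
    c = 0
    for s in range(n):
        if labels[s] != -1:
            continue
        labels[s] = c
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if labels[v] == -1:
                    labels[v] = c
                    q.append(v)
        c += 1
    return labels
```

Because clusters fall out of graph connectivity, neither the number of clusters nor their shape is specified in advance, which matches the property the abstract highlights.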


BLISS: Bandit Layer Importance Sampling Strategy for Efficient Training of Graph Neural Networks

Alsaqa, Omar, Hoang, Linh Thi, Balin, Muhammed Fatih

arXiv.org Machine Learning

Graph Neural Networks (GNNs) are powerful tools for learning from graph-structured data, but their application to large graphs is hindered by computational costs. The need to process every neighbor for each node creates memory and computational bottlenecks. To address this, we introduce BLISS, a Bandit Layer Importance Sampling Strategy. It uses multi-armed bandits to dynamically select the most informative nodes at each layer, balancing exploration and exploitation to ensure comprehensive graph coverage. Unlike existing static sampling methods, BLISS adapts to evolving node importance, leading to more informed node selection and improved performance. It demonstrates versatility by integrating with both Graph Convolutional Networks (GCNs) and Graph Attention Networks (GATs), adapting its selection policy to their specific aggregation mechanisms. Experiments show that BLISS maintains or exceeds the accuracy of full-batch training.
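The core idea — treating candidate nodes as bandit arms and balancing exploration against exploitation when choosing which ones to sample at each layer — can be illustrated with a toy UCB-style sampler. The UCB1 scoring rule, the reward signal, and all names here are assumptions for illustration; the paper's actual policy may differ.

```python
import numpy as np

class UCBNodeSampler:
    """Toy bandit-driven node sampler: each node is an arm with a running
    reward estimate; selection scores add an exploration bonus so
    rarely-sampled nodes still get picked (assumed UCB1-style policy)."""

    def __init__(self, n_nodes, c=1.0):
        self.counts = np.zeros(n_nodes)   # times each node was sampled
        self.values = np.zeros(n_nodes)   # running mean reward per node
        self.t = 0                        # total updates seen
        self.c = c                        # exploration strength

    def select(self, candidates, budget):
        """Pick `budget` nodes from `candidates` by UCB score."""
        cand = np.asarray(candidates)
        bonus = self.c * np.sqrt(np.log(self.t + 1) /
                                 (self.counts[cand] + 1e-9))
        ucb = self.values[cand] + bonus
        return cand[np.argsort(-ucb)[:budget]].tolist()

    def update(self, node, reward):
        """Fold an observed reward (e.g. a gradient-norm contribution)
        into the node's running mean estimate."""
        self.t += 1
        self.counts[node] += 1
        self.values[node] += (reward - self.values[node]) / self.counts[node]
```

Updating the estimates after every mini-batch is what makes the policy adaptive: unlike a static sampler, nodes whose observed usefulness changes during training shift in and out of the selected set.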



Quantum computers turned out to be more useful than expected in 2025

New Scientist

For the past year, I kept bringing the same story to my editor: quantum computers are on the edge of becoming useful for scientific discovery. Of course, that has always been the goal. The idea of using quantum computers to better understand our universe is part of their origin story, and it even featured in a 1981 speech by Richard Feynman. Contemplating the best way to simulate nature, he said: "We can give up on our rule about what the computer was, we can say: Let the computer itself be built of quantum mechanical elements which obey quantum mechanical laws." Today, Feynman's vision has been realised by Google, IBM and dozens more companies and academic teams. Their devices are now being used to simulate reality at the quantum level - and here are some highlights.